29 research outputs found

    An aspect-based approach to modeling access control policies

    Department Head: L. Darrell Whitley. Spring 2007. Includes bibliographical references (pages 119-126).
    Access control policies determine how sensitive information and computing resources are to be protected. Enforcing these policies in a system design typically results in access control features that crosscut the dominant structure of the design (that is, features that are spread across and intertwined with other features in the design). The spreading and intertwining of access control features make them difficult to understand, analyze, and change, and thus complicate the task of ensuring that an evolving design continues to enforce access control policies. Researchers have advocated aspect-oriented modeling (AOM) techniques for addressing the problem of evolving crosscutting features. This dissertation proposes an approach to modeling and analyzing crosscutting access control features. The approach uses AOM techniques to isolate crosscutting access control features as patterns described by aspect models. Incorporating an access control feature into a design involves embedding instantiated forms of the access control pattern into the design model. When composing instantiated access control patterns with a design model, one needs to ensure that the resulting composed model enforces the access control policies. The approach therefore includes a technique to verify that specified policies are enforced in the composed model. The approach is illustrated using two well-known access control models: the Role-Based Access Control (RBAC) model and the Bell-LaPadula (BLP) model. Features that enforce the RBAC and BLP models are described by aspect models. We show how the aspect models can be composed to create a new hybrid access control aspect model. We also show how one can verify that composing a base (primary) design model with an aspect model that enforces specified policies produces a composed model in which the policies are still enforced.
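    The kind of policy a composed model must still enforce can be made concrete with a minimal sketch. The class and names below are hypothetical illustrations of the RBAC idea (users gain permissions only through assigned roles), not the dissertation's actual aspect models:

```python
# Minimal RBAC sketch (illustrative; names are hypothetical):
# users acquire permissions only through their assigned roles.
class RBAC:
    def __init__(self):
        self.user_roles = {}   # user -> set of roles
        self.role_perms = {}   # role -> set of (operation, object) pairs

    def assign_role(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def grant(self, role, operation, obj):
        self.role_perms.setdefault(role, set()).add((operation, obj))

    def check_access(self, user, operation, obj):
        # Access is allowed iff some role of the user carries the permission.
        return any((operation, obj) in self.role_perms.get(r, set())
                   for r in self.user_roles.get(user, set()))

rbac = RBAC()
rbac.grant("doctor", "read", "record")
rbac.assign_role("alice", "doctor")
print(rbac.check_access("alice", "read", "record"))   # True
print(rbac.check_access("bob", "read", "record"))     # False
```

    Verifying a composed model amounts to checking that every access path in the design reduces to a check of this form against the specified policy.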

    Towards priority-awareness in autonomous intelligent systems

    In Autonomous and Intelligent Systems (AIS), the decision-making process can be divided into two parts: (i) the priorities of the requirements are determined at design time; (ii) design selection follows, where alternatives are compared and the preferred alternatives are chosen autonomously by the AIS. Runtime design selection is a trade-off analysis between non-functional requirements (NFRs) that uses optimisation methods, including decision analysis and utility theory. The aim is to select the design option yielding the highest expected utility. A problem with these techniques is that they use a uni-scalar cumulative utility value to represent a combined priority for all the NFRs. However, this uni-scalar value does not give information about the varying impacts of actions, under uncertain environmental contexts, on the satisfaction priorities of individual NFRs. In this paper, we present a novel use of the Multi-Reward Partially Observable Markov Decision Process (MR-POMDP) to support reasoning about separate NFR priorities. We discuss the use of rewards in MR-POMDPs as a way to support AIS with (a) priority-aware decision-making and (b) maintenance of service-level agreements, by autonomously tuning NFRs' priorities to new contexts based on data gathered at runtime. We evaluate our approach by applying it to a substantial network case study.
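    The contrast between a uni-scalar utility and per-NFR rewards can be sketched briefly. The actions, reward values, and priority weights below are hypothetical; the point is only that each NFR keeps its own expected reward, and scalarisation happens only at selection time:

```python
# Illustrative sketch (hypothetical numbers): a multi-reward decision keeps a
# separate expected reward per NFR rather than a single scalar utility, so the
# impact of each action on each individual NFR remains visible.
actions = ["increase_bandwidth", "reduce_cost"]
# Expected reward per NFR for each action, in the order
# (performance, energy, cost) -- matching the priorities dict below.
expected_rewards = {
    "increase_bandwidth": (0.9, 0.4, 0.3),
    "reduce_cost":        (0.5, 0.7, 0.9),
}
priorities = {"performance": 0.5, "energy": 0.2, "cost": 0.3}

def scalarise(rewards, weights):
    # Weighted sum used only when an action must be picked;
    # the per-NFR reward vector itself stays separate.
    return sum(r * w for r, w in zip(rewards, weights.values()))

best = max(actions, key=lambda a: scalarise(expected_rewards[a], priorities))
print(best)
```

    Because the reward vectors are preserved, the priority weights can be retuned at runtime (as the paper proposes) without refitting a collapsed scalar utility.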

    Towards an architecture integrating complex event processing and temporal graphs for service monitoring

    Software is becoming more complex as it needs to deal with an increasing number of aspects in volatile environments. This complexity may cause behaviors that violate the imposed constraints. A goal of runtime service monitoring is to determine whether the service behaves as intended, to potentially allow correction of the behavior. The monitoring infrastructure may be set up in advance to allow the detection of suspicious situations. However, there may also be unexpected situations to look for, as they only become evident at runtime while monitoring the data stream produced by the system. Access to historic data may be key to detecting relevant situations in the monitoring infrastructure. Available technologies used for monitoring offer different trade-offs, e.g., in cost and in the flexibility to store historic information. For instance, Temporal Graphs (TGs) can store the long-term history of an evolving system for future querying, at the expense of disk space and processing time. In contrast, Complex Event Processing (CEP) can react quickly and efficiently to incoming situations, as long as the appropriate event patterns have been set up in advance. This paper presents an architecture that integrates CEP and TGs for service monitoring over the data stream produced at runtime by a system. The pros and cons of the proposed architecture for extracting and treating the monitored data are analyzed. The approach is applied to the monitoring of Quality of Service (QoS) in a data-management network case study. We demonstrate how the architecture provides rapid detection of issues, as well as access to historical data about the state of the system, allowing for a comprehensive monitoring solution.
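    The division of labour between the two technologies can be illustrated with a toy in-memory sketch (the event shape, threshold, and window size are made up; a real TG is a persistent graph store, not a Python list): CEP reacts to a pre-defined pattern on the live stream, while every event is also appended to a time-indexed history for later, unanticipated queries.

```python
# Sketch of the CEP + temporal-graph split (simplified, in-memory).
from collections import deque

history = []               # stands in for the TG: full time-indexed history
recent = deque(maxlen=3)   # CEP window: only the last few events are inspected
alerts = []

def on_event(timestamp, latency_ms):
    history.append((timestamp, latency_ms))      # long-term storage
    recent.append(latency_ms)
    # CEP-style pattern, set up in advance:
    # three consecutive latency violations trigger an alert.
    if len(recent) == 3 and all(l > 100 for l in recent):
        alerts.append(timestamp)

for t, lat in [(1, 80), (2, 120), (3, 130), (4, 140), (5, 90)]:
    on_event(t, lat)

print(alerts)                          # the pattern fires at t=4
# A historical query the CEP window alone could not answer:
print(max(lat for _, lat in history))  # worst latency ever observed: 140
```

    The CEP side gives the rapid detection; the history side answers questions that were not anticipated when the patterns were set up, which is the complementarity the architecture exploits.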

    Prediction of overall survival for patients with metastatic castration-resistant prostate cancer : development of a prognostic model through a crowdsourced challenge with open clinical trial data

    Background: Improvements to prognostic models in metastatic castration-resistant prostate cancer have the potential to augment clinical trial design and guide treatment strategies. In partnership with Project Data Sphere, a not-for-profit initiative allowing data from cancer clinical trials to be shared broadly with researchers, we designed an open-data, crowdsourced DREAM (Dialogue for Reverse Engineering Assessments and Methods) challenge to not only identify a better prognostic model for prediction of survival in patients with metastatic castration-resistant prostate cancer but also engage a community of international data scientists to study this disease.

    Methods: Data from the comparator arms of four phase 3 clinical trials in first-line metastatic castration-resistant prostate cancer were obtained from Project Data Sphere, comprising 476 patients treated with docetaxel and prednisone from the ASCENT2 trial, 526 patients treated with docetaxel, prednisone, and placebo in the MAINSAIL trial, 598 patients treated with docetaxel, prednisone or prednisolone, and placebo in the VENICE trial, and 470 patients treated with docetaxel and placebo in the ENTHUSE 33 trial. Datasets consisting of more than 150 clinical variables were curated centrally, including demographics, laboratory values, medical history, lesion sites, and previous treatments. Data from ASCENT2, MAINSAIL, and VENICE were released publicly to be used as training data to predict the outcome of interest, namely overall survival. Clinical data were also released for ENTHUSE 33, but data for outcome variables (overall survival and event status) were hidden from the challenge participants so that ENTHUSE 33 could be used for independent validation. Methods were evaluated using the integrated time-dependent area under the curve (iAUC). The reference model, based on eight clinical variables and a penalised Cox proportional-hazards model, was used to compare method performance. Further validation was done using data from a fifth trial, ENTHUSE M1, in which 266 patients with metastatic castration-resistant prostate cancer were treated with placebo alone.

    Findings: 50 independent methods were developed to predict overall survival and were evaluated through the DREAM challenge. The top performer was based on an ensemble of penalised Cox regression models (ePCR), which uniquely identified predictive interaction effects with immune biomarkers and markers of hepatic and renal function. Overall, ePCR outperformed all other methods (iAUC 0.791; Bayes factor >5) and surpassed the reference model (iAUC 0.743; Bayes factor >20). Both the ePCR and reference models stratified patients in the ENTHUSE 33 trial into high-risk and low-risk groups with significantly different overall survival (ePCR: hazard ratio 3.32, 95% CI 2.39-4.62, p

    Interpretation: Novel prognostic factors were delineated, and the assessment of 50 methods developed by independent international teams establishes a benchmark for future method development. The results of this effort show that data sharing, when combined with a crowdsourced challenge, is a robust and powerful framework for developing new prognostic models in advanced prostate cancer. Peer reviewed.
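    The core mechanics of the winning approach (averaging the risk scores of several penalised Cox models, then stratifying patients at a score cutoff) can be sketched on synthetic data. The patients, covariates, and coefficients below are entirely made up for illustration and are not the ePCR model's fitted values:

```python
# Illustrative sketch with synthetic data and hypothetical coefficients:
# an ensemble averages its members' linear risk scores, then patients are
# split into high- and low-risk groups at a median-style cutoff.
patients = {
    "p1": {"psa": 80.0, "alk_phos": 200.0, "hb": 10.0},
    "p2": {"psa": 10.0, "alk_phos": 90.0,  "hb": 14.0},
    "p3": {"psa": 45.0, "alk_phos": 150.0, "hb": 11.5},
    "p4": {"psa": 5.0,  "alk_phos": 70.0,  "hb": 15.0},
}
# Each member is (variable -> coefficient), as the linear predictor of a
# fitted penalised Cox model would provide.
ensemble = [
    {"psa": 0.010, "alk_phos": 0.004, "hb": -0.20},
    {"psa": 0.012, "alk_phos": 0.003, "hb": -0.25},
]

def risk(covariates, model):
    # Linear predictor: sum of coefficient * covariate value.
    return sum(model[v] * covariates[v] for v in model)

scores = {pid: sum(risk(cov, m) for m in ensemble) / len(ensemble)
          for pid, cov in patients.items()}
cutoff = sorted(scores.values())[len(scores) // 2]   # median-style split
high_risk = sorted(pid for pid, s in scores.items() if s >= cutoff)
print(high_risk)
```

    The groups produced this way are what a hazard ratio between high- and low-risk strata would then be computed over; the actual ePCR work also modelled interaction effects, which this sketch omits.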

    Effect of hand loads on upper extremity muscle activity during pushing and pulling motions

    No full text
    Manual pushing or pulling with a hand tool is a coordinated action by various upper extremity muscles. The objective of this experimental study was to examine the effects of horizontal and vertical hand loads on upper extremity muscle activity during concentric pushing and pulling exertions. Twenty young female participants conducted repetitive pushing and pulling trials with three horizontal loads (1 kg, 2 kg, 3 kg) and two vertical loads (0.6 kg, 1.3 kg) in a seated posture, while the myoelectric activity of seven upper extremity and shoulder muscles was quantified. Study results indicate that the shoulder flexor and extensor muscles were more strongly associated with the horizontal load, and the elbow flexors were more sensitive to the vertical load. The empirical data from this systematic evaluation can offer initial insights for the ergonomic design and evaluation of hand tools or occupational tasks that involve repetitive pushing or pulling.

    Toward an Integrated Tool Environment for Static Analysis of UML Class and Sequence Models

    No full text
    There is a need for more rigorous analysis techniques that developers can use for verifying critical properties in UML models. The UML-based Specification Environment (USE) tool supports verification of invariants, preconditions, and postconditions specified in the Object Constraint Language (OCL). Due to its animation and analysis power, it is useful for checking critical non-functional properties such as security policies. However, USE requires one to specify a model using its own textual language and does not allow one to import model specification files created by other UML modeling tools. Hence, one would create a model with OCL constraints using a modeling tool such as IBM Rational Software Architect (RSA) and then use USE for model verification. This approach, however, requires a manual transformation between two different specification formats, which diminishes the advantage of using tools for model-level verification. In this paper, we describe our implementation of a specification transformation engine based on the Model-Driven Architecture (MDA) framework. Our approach currently supports automatic tool-level transformations to USE from UML modeling tools built on the Eclipse Modeling Framework (EMF).
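    The essence of such a transformation is mapping one model representation into USE's textual syntax. The toy sketch below works on a plain dictionary rather than a real EMF model, and the model content (class, attributes, invariant) is made up; it only illustrates the mapping step the paper's engine automates:

```python
# Toy sketch of the transformation idea: map a dictionary description of a
# UML class model with OCL invariants into USE's textual specification syntax.
# (The actual engine consumes EMF models; the content here is hypothetical.)
model = {
    "name": "Banking",
    "classes": {
        "Account": {"balance": "Integer", "owner": "String"},
    },
    "invariants": {
        "Account": ["self.balance >= 0"],
    },
}

def to_use(model):
    lines = [f"model {model['name']}", ""]
    for cls, attrs in model["classes"].items():
        lines.append(f"class {cls}")
        lines.append("attributes")
        for name, typ in attrs.items():
            lines.append(f"  {name} : {typ}")
        lines.append("end")
        lines.append("")
    lines.append("constraints")
    for cls, invs in model["invariants"].items():
        for i, inv in enumerate(invs):
            lines.append(f"context {cls} inv inv{i}: {inv}")
    return "\n".join(lines)

spec = to_use(model)
print(spec)
```

    An MDA-based engine performs the same kind of mapping declaratively, with transformation rules defined over the source and target metamodels instead of hand-written string emission.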